
MeshToLevelSet Rendering Fixes #5923

Merged: 1 commit, Jul 9, 2024

Conversation

danieldresser-ie
Contributor

Artists at ImageEngine were confused why they couldn't render the output from MeshToLevelSet in Arnold.

I started by not discarding the exception message when it fails to render, which revealed the exception: "LookupError: Cannot find metadata file_mem_bytes".

I think we can reasonably assume that any VDB loaded from disk will have this stats information, so there are two reasonable solutions: either make sure every volume produced in Gaffer has these stats computed, or make rendering fall back to the memUsage() call that computes the size when the metadata isn't found. I've currently gone with the first, since the second seems to carry more of a potential performance hazard (there might be some sort of instancing setup where you could end up calling memUsage() many times?), but I haven't thought too hard about it, and there might be an argument for the second approach.
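The fallback approach being weighed can be sketched in isolation. This is a minimal illustration of the lookup-then-compute pattern, not Gaffer or OpenVDB code: a std::map stands in for OpenVDB's grid metadata, and computeMemUsage() stands in for the potentially expensive Grid::memUsage() call; both names are illustrative.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

// std::map stands in for the grid's metadata map in this sketch.
using Metadata = std::map<std::string, int64_t>;

// Placeholder for the (potentially expensive) tree walk that
// Grid::memUsage() performs in OpenVDB.
int64_t computeMemUsage()
{
    return 1024;
}

// Prefer the precomputed "file_mem_bytes" stat when present, and only
// fall back to computing the size when the metadata is absent.
int64_t estimatedBytes( const Metadata &metadata )
{
    auto it = metadata.find( "file_mem_bytes" );
    if( it != metadata.end() )
    {
        return it->second;  // cheap: stat was written alongside the file
    }
    return computeMemUsage();  // fallback: compute on demand
}
```

The performance concern in the comment above is that in an instancing setup the fallback branch might run once per instance rather than once per grid, which is why caching the stat at creation time was attractive.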

The obvious next step is that if people actually want to render the result of these operations as volumes, they need a way to convert them to fog volumes rather than level sets. We should probably have a LevelSetToVolume node, but it would be pretty wasteful to cache both versions, so there probably ought to be a MeshToVolume node as well (which would do what the current MeshToLevelSet does, and then call openvdb::tools::sdfToFogVolume, from LevelSetUtil.h, in place on the result).
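For intuition on the level-set-to-fog conversion being proposed, here is an illustrative per-voxel remap. This is not the OpenVDB implementation of sdfToFogVolume, just a sketch of the idea: a level set stores signed distances (negative inside the surface), while a fog volume stores densities, so interior values are remapped to densities in (0, 1] and exterior values to zero. The interiorBandwidth parameter is an assumption standing in for the transition width the real tool derives from the grid.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical per-voxel remap in the spirit of sdfToFogVolume.
// Interior (negative) SDF values ramp linearly up to full density over
// interiorBandwidth; everything on or outside the surface becomes zero.
float sdfToDensity( float sdf, float interiorBandwidth )
{
    if( sdf >= 0.0f )
    {
        return 0.0f;  // exterior: no density
    }
    // Linear ramp from 0 at the surface to 1 at interiorBandwidth inside.
    return std::min( 1.0f, -sdf / interiorBandwidth );
}
```

This is also why a level set renders unexpectedly when treated as a volume: without the remap, the renderer sees signed distances rather than densities.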

@johnhaddon
Member

Welcome back Daniel, and thanks for looking into this!

> Artists at ImageEngine were confused why they couldn't render the output from MeshToLevelSet in Arnold.

Reading through the code, I'm not sure it was ever the intention to fail to render grids which didn't have the metadata. I suspect the intention was just to print a warning when we couldn't figure out a size estimate, but to render regardless. And the fact that gridsToWrite was incomplete in this case was just an oversight.

> I think we can reasonably assume that any vdb loaded from disk should have this stats information

The existence of openvdb::io::Archive::setGridStatsMetadataEnabled() suggests that it is at least possible to write a file without the stats.

> make sure every volume produced in Gaffer has these stats computed...I've currently gone with the first one

I was confused by this, since you've only modified MeshToLevelSet, and I couldn't see any existing code doing the same in LevelSetOffset or PointsToLevelSet. To confuse me further, testing with the SceneInspector suggested that those nodes do generate the metadata, despite not containing a call to addStatsMetadata() and no such call existing in the underlying VDB functions the nodes use. But I think the SceneInspector was misleading: it seems that IECoreVDB::VDBObject::metadata() calls addStatsMetadata() "just in time", and that is how the SceneInspector was getting it. IECoreArnold::VDBAlgo doesn't get it because it goes direct to the grid metadata rather than via VDBObject::metadata().

I'm not actually sure how valid it is to use file_mem_bytes to estimate the required buffer size. Looking at the VDB source, what it actually contains is the in-memory footprint of the grid, so if there's any compression involved in writing to the stream (and I think there is - blosc, right?) then it'll be an over-estimate.

And since the addStatsMetadata() documentation says "This metadata is not automatically kept up-to-date with changes to this grid", I wonder if it's even a user expectation that it would be up-to-date in Gaffer. Certainly the only call to addStatsMetadata() in VDB itself is in the file writing code. Does anyone know if other apps continually update it?

> there might be an argument for the second approach.

I definitely don't think we should require the metadata to exist in order to render. And since the metadata may not be up to date, and doesn't even measure the required stream size anyway, I question whether we should use it at all. I'd also advocate for the following:

  1. We should treat VDB metadata in Gaffer how we treat image metadata - just along for the ride, with no impact on any operations, and with no automatic updates.
  2. VDBObject::metadata() should stop calling addStatsMetadata(), because it looks for all the world like it is just a convenience function for converting the existing metadata to Cortex Data. Mutating the underlying data is surprising, and I suspect may also be a thread-safety and/or cache-immutability issue in Gaffer.
  3. SphereLevelSet should stop calling addStatsMetadata().

Seems like a good topic for our catch up today...

@danieldresser-ie
Contributor Author

Updated this PR to implement your suggestion that Arnold rendering should not depend on the metadata.

@johnhaddon
Member

Rebased to fix conflict, and reworded Changes.md to be suitably user-facing. Merging.

@johnhaddon johnhaddon merged commit 73ad0fd into GafferHQ:1.4_maintenance Jul 9, 2024
5 checks passed